In [ ]:
# Use DeepSurv from the repo
import sys
sys.path.append('../deepsurv')
import deep_surv

from deepsurv_logger import DeepSurvLogger, TensorboardLogger
import utils
import viz

import numpy as np
import pandas as pd

import lasagne
import matplotlib
import matplotlib.pyplot as plt
%matplotlib inline

Read in dataset

First, I read in the dataset and print the first five rows to get a sense of what the data looks like.


In [21]:
train_dataset_fp = './Data_for_Jared_True_CSV.txt'
train_df = pd.read_csv(train_dataset_fp)
train_df.head()


Out[21]:
   Variable_1  Variable_2  Variable_3  Variable_4  Event  Time
0           0           3           2         4.6      1    43
1           0           2           0         1.6      0    52
2           0           3           0         3.5      1    73
3           0           3           1         5.1      0    51
4           0           2           0         1.7      0    51

Transform the dataset to "DeepSurv" format

DeepSurv expects a dataset to be in the form:

{
    'x': numpy array of float32 (the patient covariates),
    'e': numpy array of int32 (the event indicator),
    't': numpy array of float32 (the event or censoring time),
    'hr': (optional) numpy array of float32
}
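
For illustration, here is a minimal hand-built dataset in this format. The values are made up and are not taken from your CSV; they just show the expected types and shapes:

import numpy as np  # already imported above

# Purely illustrative toy dataset: two patients with three covariates each
toy_data = {
    'x': np.array([[1.0, 0.0, 3.2],
                   [0.0, 1.0, 2.7]], dtype=np.float32),  # covariates
    'e': np.array([1, 0], dtype=np.int32),               # 1 = event observed, 0 = censored
    't': np.array([43.0, 52.0], dtype=np.float32)        # event / censoring times
}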

You provide the data as a CSV, which I read in as a pandas DataFrame. I then convert the DataFrame into the DeepSurv dataset format described above.


In [22]:
# event_col is the header in the df that represents the 'Event / Status' indicator
# time_col is the header in the df that represents the event time
def dataframe_to_deepsurv_ds(df, event_col = 'Event', time_col = 'Time'):
    # Extract the event and time columns as numpy arrays
    e = df[event_col].values.astype(np.int32)
    t = df[time_col].values.astype(np.float32)

    # Extract the patient's covariates as a numpy array
    x_df = df.drop([event_col, time_col], axis = 1)
    x = x_df.values.astype(np.float32)
    
    # Return the DeepSurv dataset (a dictionary)
    return {
        'x' : x,
        'e' : e,
        't' : t
    }

# If the headers of the csv change, you can replace the values of 
# 'event_col' and 'time_col' with the names of the new headers
# You can also use this function on your training dataset, validation dataset, and testing dataset
train_data = dataframe_to_deepsurv_ds(train_df, event_col = 'Event', time_col= 'Time')
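
For example, if you also have a validation CSV with the same column headers (the file name below is just a placeholder), you can convert it in exactly the same way:

# Hypothetical file -- replace with the actual path to your validation CSV
valid_df = pd.read_csv('./My_Validation_Data.csv')
valid_data = dataframe_to_deepsurv_ds(valid_df, event_col = 'Event', time_col = 'Time')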

Once you have your dataset formatted, define your hyper-parameters as a Python dictionary. I'll provide some example hyper-parameters below, but you should replace these values once you have tuned them to your specific dataset.


In [23]:
hyperparams = {
    'L2_reg': 10.0,
    'batch_norm': True,
    'dropout': 0.4,
    'hidden_layers_sizes': [25, 25],
    'learning_rate': 1e-05,
    'lr_decay': 0.001,
    'momentum': 0.9,
    'n_in': train_data['x'].shape[1],
    'standardize': True
}
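
Note that 'n_in' is taken directly from the covariate matrix, so a quick (optional) sanity check is to confirm it matches the number of covariate columns:

# Optional sanity check: 'n_in' should equal the number of covariate columns
print(train_data['x'].shape)   # (number of patients, number of covariates)
print(hyperparams['n_in'])     # should equal the second number printed above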

Once you have prepared your dataset and defined your hyper-parameters, it's time to train DeepSurv!


In [28]:
# Create an instance of DeepSurv using the hyperparams defined above
model = deep_surv.DeepSurv(**hyperparams)

# DeepSurv can now leverage TensorBoard to monitor training and validation.
# This section of code is optional. If you don't want to use the TensorBoard logger,
# uncomment the line below and comment out the three lines that follow it:
# logger = None

experiment_name = 'test_experiment_sebastian'
logdir = './logs/tensorboard/'
logger = TensorboardLogger(experiment_name, logdir=logdir)

# Now we train the model
# The optimizer to use. See http://lasagne.readthedocs.io/en/latest/modules/updates.html
# for other optimizers available in Lasagne.
update_fn = lasagne.updates.nesterov_momentum
n_epochs = 2000

# If you have validation data, you can add it as the second parameter to model.train (see the sketch below)
metrics = model.train(train_data, n_epochs=n_epochs, logger=logger, update_fn=update_fn)
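
As a minimal sketch, passing validation data (for example the hypothetical valid_data built earlier) would look like the commented-out call below; the output that follows is from the run without it:

# metrics = model.train(train_data, valid_data, n_epochs=n_epochs, logger=logger, update_fn=update_fn)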


Standardizing x with
2017-06-09 00:44:54,714 - Training step 0/2000    |                         | - loss: 1501.5049 - ci: 0.5311
2017-06-09 00:44:54,714 - Training step 0/2000    |                         | - loss: 1501.5049 - ci: 0.5311
2017-06-09 00:44:54,714 - Training step 0/2000    |                         | - loss: 1501.5049 - ci: 0.5311
2017-06-09 00:44:56,302 - Training step 10/2000   |                         | - loss: 1478.5582 - ci: 0.6236
2017-06-09 00:44:56,302 - Training step 10/2000   |                         | - loss: 1478.5582 - ci: 0.6236
2017-06-09 00:44:56,302 - Training step 10/2000   |                         | - loss: 1478.5582 - ci: 0.6236
2017-06-09 00:44:57,804 - Training step 20/2000   |                         | - loss: 1389.9281 - ci: 0.6942
2017-06-09 00:44:57,804 - Training step 20/2000   |                         | - loss: 1389.9281 - ci: 0.6942
2017-06-09 00:44:57,804 - Training step 20/2000   |                         | - loss: 1389.9281 - ci: 0.6942
2017-06-09 00:44:59,229 - Training step 30/2000   |                         | - loss: 1361.9044 - ci: 0.7034
2017-06-09 00:44:59,229 - Training step 30/2000   |                         | - loss: 1361.9044 - ci: 0.7034
2017-06-09 00:44:59,229 - Training step 30/2000   |                         | - loss: 1361.9044 - ci: 0.7034
2017-06-09 00:45:00,685 - Training step 40/2000   |                         | - loss: 1363.5573 - ci: 0.7058
2017-06-09 00:45:00,685 - Training step 40/2000   |                         | - loss: 1363.5573 - ci: 0.7058
2017-06-09 00:45:00,685 - Training step 40/2000   |                         | - loss: 1363.5573 - ci: 0.7058
2017-06-09 00:45:02,203 - Training step 50/2000   |                         | - loss: 1363.1553 - ci: 0.7111
2017-06-09 00:45:02,203 - Training step 50/2000   |                         | - loss: 1363.1553 - ci: 0.7111
2017-06-09 00:45:02,203 - Training step 50/2000   |                         | - loss: 1363.1553 - ci: 0.7111
2017-06-09 00:45:04,015 - Training step 60/2000   |                         | - loss: 1327.0909 - ci: 0.7124
2017-06-09 00:45:04,015 - Training step 60/2000   |                         | - loss: 1327.0909 - ci: 0.7124
2017-06-09 00:45:04,015 - Training step 60/2000   |                         | - loss: 1327.0909 - ci: 0.7124
2017-06-09 00:45:05,432 - Training step 70/2000   |                         | - loss: 1331.9385 - ci: 0.7112
2017-06-09 00:45:05,432 - Training step 70/2000   |                         | - loss: 1331.9385 - ci: 0.7112
2017-06-09 00:45:05,432 - Training step 70/2000   |                         | - loss: 1331.9385 - ci: 0.7112
2017-06-09 00:45:07,316 - Training step 80/2000   |*                        | - loss: 1331.4475 - ci: 0.7119
2017-06-09 00:45:07,316 - Training step 80/2000   |*                        | - loss: 1331.4475 - ci: 0.7119
2017-06-09 00:45:07,316 - Training step 80/2000   |*                        | - loss: 1331.4475 - ci: 0.7119
2017-06-09 00:45:09,050 - Training step 90/2000   |*                        | - loss: 1324.7543 - ci: 0.7115
2017-06-09 00:45:09,050 - Training step 90/2000   |*                        | - loss: 1324.7543 - ci: 0.7115
2017-06-09 00:45:09,050 - Training step 90/2000   |*                        | - loss: 1324.7543 - ci: 0.7115
2017-06-09 00:45:10,740 - Training step 100/2000  |*                        | - loss: 1328.7374 - ci: 0.7110
2017-06-09 00:45:10,740 - Training step 100/2000  |*                        | - loss: 1328.7374 - ci: 0.7110
2017-06-09 00:45:10,740 - Training step 100/2000  |*                        | - loss: 1328.7374 - ci: 0.7110
2017-06-09 00:45:12,405 - Training step 110/2000  |*                        | - loss: 1325.3510 - ci: 0.7111
2017-06-09 00:45:12,405 - Training step 110/2000  |*                        | - loss: 1325.3510 - ci: 0.7111
2017-06-09 00:45:12,405 - Training step 110/2000  |*                        | - loss: 1325.3510 - ci: 0.7111
2017-06-09 00:45:14,015 - Training step 120/2000  |*                        | - loss: 1319.7078 - ci: 0.7111
2017-06-09 00:45:14,015 - Training step 120/2000  |*                        | - loss: 1319.7078 - ci: 0.7111
2017-06-09 00:45:14,015 - Training step 120/2000  |*                        | - loss: 1319.7078 - ci: 0.7111
2017-06-09 00:45:15,562 - Training step 130/2000  |*                        | - loss: 1324.7845 - ci: 0.7111
2017-06-09 00:45:15,562 - Training step 130/2000  |*                        | - loss: 1324.7845 - ci: 0.7111
2017-06-09 00:45:15,562 - Training step 130/2000  |*                        | - loss: 1324.7845 - ci: 0.7111
2017-06-09 00:45:16,970 - Training step 140/2000  |*                        | - loss: 1309.3800 - ci: 0.7110
2017-06-09 00:45:16,970 - Training step 140/2000  |*                        | - loss: 1309.3800 - ci: 0.7110
2017-06-09 00:45:16,970 - Training step 140/2000  |*                        | - loss: 1309.3800 - ci: 0.7110
2017-06-09 00:45:18,537 - Training step 150/2000  |*                        | - loss: 1318.7681 - ci: 0.7115
2017-06-09 00:45:18,537 - Training step 150/2000  |*                        | - loss: 1318.7681 - ci: 0.7115
2017-06-09 00:45:18,537 - Training step 150/2000  |*                        | - loss: 1318.7681 - ci: 0.7115
2017-06-09 00:45:20,307 - Training step 160/2000  |**                       | - loss: 1312.8696 - ci: 0.7112
2017-06-09 00:45:20,307 - Training step 160/2000  |**                       | - loss: 1312.8696 - ci: 0.7112
2017-06-09 00:45:20,307 - Training step 160/2000  |**                       | - loss: 1312.8696 - ci: 0.7112
2017-06-09 00:45:22,238 - Training step 170/2000  |**                       | - loss: 1314.8079 - ci: 0.7113
2017-06-09 00:45:22,238 - Training step 170/2000  |**                       | - loss: 1314.8079 - ci: 0.7113
2017-06-09 00:45:22,238 - Training step 170/2000  |**                       | - loss: 1314.8079 - ci: 0.7113
2017-06-09 00:45:24,460 - Training step 180/2000  |**                       | - loss: 1308.8923 - ci: 0.7118
2017-06-09 00:45:24,460 - Training step 180/2000  |**                       | - loss: 1308.8923 - ci: 0.7118
2017-06-09 00:45:24,460 - Training step 180/2000  |**                       | - loss: 1308.8923 - ci: 0.7118
2017-06-09 00:45:26,134 - Training step 190/2000  |**                       | - loss: 1318.2069 - ci: 0.7122
2017-06-09 00:45:26,134 - Training step 190/2000  |**                       | - loss: 1318.2069 - ci: 0.7122
2017-06-09 00:45:26,134 - Training step 190/2000  |**                       | - loss: 1318.2069 - ci: 0.7122
2017-06-09 00:45:27,763 - Training step 200/2000  |**                       | - loss: 1315.4109 - ci: 0.7124
2017-06-09 00:45:27,763 - Training step 200/2000  |**                       | - loss: 1315.4109 - ci: 0.7124
2017-06-09 00:45:27,763 - Training step 200/2000  |**                       | - loss: 1315.4109 - ci: 0.7124
2017-06-09 00:45:29,152 - Training step 210/2000  |**                       | - loss: 1320.3564 - ci: 0.7126
2017-06-09 00:45:29,152 - Training step 210/2000  |**                       | - loss: 1320.3564 - ci: 0.7126
2017-06-09 00:45:29,152 - Training step 210/2000  |**                       | - loss: 1320.3564 - ci: 0.7126
2017-06-09 00:45:30,755 - Training step 220/2000  |**                       | - loss: 1321.5094 - ci: 0.7124
2017-06-09 00:45:30,755 - Training step 220/2000  |**                       | - loss: 1321.5094 - ci: 0.7124
2017-06-09 00:45:30,755 - Training step 220/2000  |**                       | - loss: 1321.5094 - ci: 0.7124
2017-06-09 00:45:32,497 - Training step 230/2000  |**                       | - loss: 1316.7141 - ci: 0.7127
2017-06-09 00:45:32,497 - Training step 230/2000  |**                       | - loss: 1316.7141 - ci: 0.7127
2017-06-09 00:45:32,497 - Training step 230/2000  |**                       | - loss: 1316.7141 - ci: 0.7127
2017-06-09 00:45:33,980 - Training step 240/2000  |***                      | - loss: 1308.3854 - ci: 0.7117
2017-06-09 00:45:33,980 - Training step 240/2000  |***                      | - loss: 1308.3854 - ci: 0.7117
2017-06-09 00:45:33,980 - Training step 240/2000  |***                      | - loss: 1308.3854 - ci: 0.7117
2017-06-09 00:45:35,628 - Training step 250/2000  |***                      | - loss: 1327.4872 - ci: 0.7125
2017-06-09 00:45:35,628 - Training step 250/2000  |***                      | - loss: 1327.4872 - ci: 0.7125
2017-06-09 00:45:35,628 - Training step 250/2000  |***                      | - loss: 1327.4872 - ci: 0.7125
2017-06-09 00:45:37,138 - Training step 260/2000  |***                      | - loss: 1302.3836 - ci: 0.7128
2017-06-09 00:45:37,138 - Training step 260/2000  |***                      | - loss: 1302.3836 - ci: 0.7128
2017-06-09 00:45:37,138 - Training step 260/2000  |***                      | - loss: 1302.3836 - ci: 0.7128
2017-06-09 00:45:38,510 - Training step 270/2000  |***                      | - loss: 1317.4918 - ci: 0.7130
2017-06-09 00:45:38,510 - Training step 270/2000  |***                      | - loss: 1317.4918 - ci: 0.7130
2017-06-09 00:45:38,510 - Training step 270/2000  |***                      | - loss: 1317.4918 - ci: 0.7130
2017-06-09 00:45:40,011 - Training step 280/2000  |***                      | - loss: 1309.9626 - ci: 0.7128
2017-06-09 00:45:40,011 - Training step 280/2000  |***                      | - loss: 1309.9626 - ci: 0.7128
2017-06-09 00:45:40,011 - Training step 280/2000  |***                      | - loss: 1309.9626 - ci: 0.7128
2017-06-09 00:45:41,524 - Training step 290/2000  |***                      | - loss: 1317.1891 - ci: 0.7131
2017-06-09 00:45:41,524 - Training step 290/2000  |***                      | - loss: 1317.1891 - ci: 0.7131
2017-06-09 00:45:41,524 - Training step 290/2000  |***                      | - loss: 1317.1891 - ci: 0.7131
2017-06-09 00:45:42,907 - Training step 300/2000  |***                      | - loss: 1315.3977 - ci: 0.7133
2017-06-09 00:45:42,907 - Training step 300/2000  |***                      | - loss: 1315.3977 - ci: 0.7133
2017-06-09 00:45:42,907 - Training step 300/2000  |***                      | - loss: 1315.3977 - ci: 0.7133
2017-06-09 00:45:44,429 - Training step 310/2000  |***                      | - loss: 1316.5371 - ci: 0.7129
2017-06-09 00:45:44,429 - Training step 310/2000  |***                      | - loss: 1316.5371 - ci: 0.7129
2017-06-09 00:45:44,429 - Training step 310/2000  |***                      | - loss: 1316.5371 - ci: 0.7129
2017-06-09 00:45:46,005 - Training step 320/2000  |****                     | - loss: 1311.9975 - ci: 0.7136
2017-06-09 00:45:46,005 - Training step 320/2000  |****                     | - loss: 1311.9975 - ci: 0.7136
2017-06-09 00:45:46,005 - Training step 320/2000  |****                     | - loss: 1311.9975 - ci: 0.7136
2017-06-09 00:45:47,560 - Training step 330/2000  |****                     | - loss: 1319.2411 - ci: 0.7133
2017-06-09 00:45:47,560 - Training step 330/2000  |****                     | - loss: 1319.2411 - ci: 0.7133
2017-06-09 00:45:47,560 - Training step 330/2000  |****                     | - loss: 1319.2411 - ci: 0.7133
2017-06-09 00:45:48,938 - Training step 340/2000  |****                     | - loss: 1314.4581 - ci: 0.7136
2017-06-09 00:45:48,938 - Training step 340/2000  |****                     | - loss: 1314.4581 - ci: 0.7136
2017-06-09 00:45:48,938 - Training step 340/2000  |****                     | - loss: 1314.4581 - ci: 0.7136
2017-06-09 00:45:50,499 - Training step 350/2000  |****                     | - loss: 1300.1730 - ci: 0.7133
2017-06-09 00:45:50,499 - Training step 350/2000  |****                     | - loss: 1300.1730 - ci: 0.7133
2017-06-09 00:45:50,499 - Training step 350/2000  |****                     | - loss: 1300.1730 - ci: 0.7133
2017-06-09 00:45:52,190 - Training step 360/2000  |****                     | - loss: 1312.1615 - ci: 0.7137
2017-06-09 00:45:52,190 - Training step 360/2000  |****                     | - loss: 1312.1615 - ci: 0.7137
2017-06-09 00:45:52,190 - Training step 360/2000  |****                     | - loss: 1312.1615 - ci: 0.7137
2017-06-09 00:45:53,702 - Training step 370/2000  |****                     | - loss: 1313.1401 - ci: 0.7138
2017-06-09 00:45:53,702 - Training step 370/2000  |****                     | - loss: 1313.1401 - ci: 0.7138
2017-06-09 00:45:53,702 - Training step 370/2000  |****                     | - loss: 1313.1401 - ci: 0.7138
2017-06-09 00:45:55,376 - Training step 380/2000  |****                     | - loss: 1299.6676 - ci: 0.7139
2017-06-09 00:45:55,376 - Training step 380/2000  |****                     | - loss: 1299.6676 - ci: 0.7139
2017-06-09 00:45:55,376 - Training step 380/2000  |****                     | - loss: 1299.6676 - ci: 0.7139
2017-06-09 00:45:56,958 - Training step 390/2000  |****                     | - loss: 1308.9985 - ci: 0.7132
2017-06-09 00:45:56,958 - Training step 390/2000  |****                     | - loss: 1308.9985 - ci: 0.7132
2017-06-09 00:45:56,958 - Training step 390/2000  |****                     | - loss: 1308.9985 - ci: 0.7132
2017-06-09 00:45:58,573 - Training step 400/2000  |*****                    | - loss: 1314.8281 - ci: 0.7137
2017-06-09 00:45:58,573 - Training step 400/2000  |*****                    | - loss: 1314.8281 - ci: 0.7137
2017-06-09 00:45:58,573 - Training step 400/2000  |*****                    | - loss: 1314.8281 - ci: 0.7137
2017-06-09 00:46:00,012 - Training step 410/2000  |*****                    | - loss: 1309.8234 - ci: 0.7143
2017-06-09 00:46:00,012 - Training step 410/2000  |*****                    | - loss: 1309.8234 - ci: 0.7143
2017-06-09 00:46:00,012 - Training step 410/2000  |*****                    | - loss: 1309.8234 - ci: 0.7143
2017-06-09 00:46:01,531 - Training step 420/2000  |*****                    | - loss: 1310.8387 - ci: 0.7139
2017-06-09 00:46:01,531 - Training step 420/2000  |*****                    | - loss: 1310.8387 - ci: 0.7139
2017-06-09 00:46:01,531 - Training step 420/2000  |*****                    | - loss: 1310.8387 - ci: 0.7139
2017-06-09 00:46:03,041 - Training step 430/2000  |*****                    | - loss: 1307.7185 - ci: 0.7143
2017-06-09 00:46:03,041 - Training step 430/2000  |*****                    | - loss: 1307.7185 - ci: 0.7143
2017-06-09 00:46:03,041 - Training step 430/2000  |*****                    | - loss: 1307.7185 - ci: 0.7143
2017-06-09 00:46:04,419 - Training step 440/2000  |*****                    | - loss: 1315.8556 - ci: 0.7141
2017-06-09 00:46:04,419 - Training step 440/2000  |*****                    | - loss: 1315.8556 - ci: 0.7141
2017-06-09 00:46:04,419 - Training step 440/2000  |*****                    | - loss: 1315.8556 - ci: 0.7141
2017-06-09 00:46:05,925 - Training step 450/2000  |*****                    | - loss: 1318.6137 - ci: 0.7144
2017-06-09 00:46:05,925 - Training step 450/2000  |*****                    | - loss: 1318.6137 - ci: 0.7144
2017-06-09 00:46:05,925 - Training step 450/2000  |*****                    | - loss: 1318.6137 - ci: 0.7144
2017-06-09 00:46:07,380 - Training step 460/2000  |*****                    | - loss: 1316.2490 - ci: 0.7148
2017-06-09 00:46:07,380 - Training step 460/2000  |*****                    | - loss: 1316.2490 - ci: 0.7148
2017-06-09 00:46:07,380 - Training step 460/2000  |*****                    | - loss: 1316.2490 - ci: 0.7148
2017-06-09 00:46:08,822 - Training step 470/2000  |*****                    | - loss: 1311.4232 - ci: 0.7144
2017-06-09 00:46:08,822 - Training step 470/2000  |*****                    | - loss: 1311.4232 - ci: 0.7144
2017-06-09 00:46:08,822 - Training step 470/2000  |*****                    | - loss: 1311.4232 - ci: 0.7144
2017-06-09 00:46:10,284 - Training step 480/2000  |******                   | - loss: 1317.0009 - ci: 0.7148
2017-06-09 00:46:10,284 - Training step 480/2000  |******                   | - loss: 1317.0009 - ci: 0.7148
2017-06-09 00:46:10,284 - Training step 480/2000  |******                   | - loss: 1317.0009 - ci: 0.7148
2017-06-09 00:46:11,874 - Training step 490/2000  |******                   | - loss: 1310.4235 - ci: 0.7146
2017-06-09 00:46:11,874 - Training step 490/2000  |******                   | - loss: 1310.4235 - ci: 0.7146
2017-06-09 00:46:11,874 - Training step 490/2000  |******                   | - loss: 1310.4235 - ci: 0.7146
2017-06-09 00:46:13,218 - Training step 500/2000  |******                   | - loss: 1303.0604 - ci: 0.7145
2017-06-09 00:46:13,218 - Training step 500/2000  |******                   | - loss: 1303.0604 - ci: 0.7145
2017-06-09 00:46:13,218 - Training step 500/2000  |******                   | - loss: 1303.0604 - ci: 0.7145
2017-06-09 00:46:14,763 - Training step 510/2000  |******                   | - loss: 1309.1406 - ci: 0.7146
2017-06-09 00:46:14,763 - Training step 510/2000  |******                   | - loss: 1309.1406 - ci: 0.7146
2017-06-09 00:46:14,763 - Training step 510/2000  |******                   | - loss: 1309.1406 - ci: 0.7146
2017-06-09 00:46:16,269 - Training step 520/2000  |******                   | - loss: 1305.7442 - ci: 0.7147
2017-06-09 00:46:16,269 - Training step 520/2000  |******                   | - loss: 1305.7442 - ci: 0.7147
2017-06-09 00:46:16,269 - Training step 520/2000  |******                   | - loss: 1305.7442 - ci: 0.7147
2017-06-09 00:46:17,861 - Training step 530/2000  |******                   | - loss: 1321.5040 - ci: 0.7148
2017-06-09 00:46:17,861 - Training step 530/2000  |******                   | - loss: 1321.5040 - ci: 0.7148
2017-06-09 00:46:17,861 - Training step 530/2000  |******                   | - loss: 1321.5040 - ci: 0.7148
2017-06-09 00:46:19,286 - Training step 540/2000  |******                   | - loss: 1308.5344 - ci: 0.7151
2017-06-09 00:46:19,286 - Training step 540/2000  |******                   | - loss: 1308.5344 - ci: 0.7151
2017-06-09 00:46:19,286 - Training step 540/2000  |******                   | - loss: 1308.5344 - ci: 0.7151
2017-06-09 00:46:20,777 - Training step 550/2000  |******                   | - loss: 1310.4202 - ci: 0.7147
2017-06-09 00:46:20,777 - Training step 550/2000  |******                   | - loss: 1310.4202 - ci: 0.7147
2017-06-09 00:46:20,777 - Training step 550/2000  |******                   | - loss: 1310.4202 - ci: 0.7147
2017-06-09 00:46:22,408 - Training step 560/2000  |*******                  | - loss: 1309.8176 - ci: 0.7141
2017-06-09 00:46:22,408 - Training step 560/2000  |*******                  | - loss: 1309.8176 - ci: 0.7141
2017-06-09 00:46:22,408 - Training step 560/2000  |*******                  | - loss: 1309.8176 - ci: 0.7141
2017-06-09 00:46:23,754 - Training step 570/2000  |*******                  | - loss: 1303.7040 - ci: 0.7143
2017-06-09 00:46:23,754 - Training step 570/2000  |*******                  | - loss: 1303.7040 - ci: 0.7143
2017-06-09 00:46:23,754 - Training step 570/2000  |*******                  | - loss: 1303.7040 - ci: 0.7143
2017-06-09 00:46:25,269 - Training step 580/2000  |*******                  | - loss: 1308.3771 - ci: 0.7145
2017-06-09 00:46:25,269 - Training step 580/2000  |*******                  | - loss: 1308.3771 - ci: 0.7145
2017-06-09 00:46:25,269 - Training step 580/2000  |*******                  | - loss: 1308.3771 - ci: 0.7145
2017-06-09 00:46:26,854 - Training step 590/2000  |*******                  | - loss: 1309.0045 - ci: 0.7148
2017-06-09 00:46:26,854 - Training step 590/2000  |*******                  | - loss: 1309.0045 - ci: 0.7148
2017-06-09 00:46:26,854 - Training step 590/2000  |*******                  | - loss: 1309.0045 - ci: 0.7148
2017-06-09 00:46:28,362 - Training step 600/2000  |*******                  | - loss: 1321.5077 - ci: 0.7150
2017-06-09 00:46:28,362 - Training step 600/2000  |*******                  | - loss: 1321.5077 - ci: 0.7150
2017-06-09 00:46:28,362 - Training step 600/2000  |*******                  | - loss: 1321.5077 - ci: 0.7150
2017-06-09 00:46:29,759 - Training step 610/2000  |*******                  | - loss: 1309.3300 - ci: 0.7149
2017-06-09 00:46:29,759 - Training step 610/2000  |*******                  | - loss: 1309.3300 - ci: 0.7149
2017-06-09 00:46:29,759 - Training step 610/2000  |*******                  | - loss: 1309.3300 - ci: 0.7149
2017-06-09 00:46:31,296 - Training step 620/2000  |*******                  | - loss: 1310.1265 - ci: 0.7150
2017-06-09 00:46:31,296 - Training step 620/2000  |*******                  | - loss: 1310.1265 - ci: 0.7150
2017-06-09 00:46:31,296 - Training step 620/2000  |*******                  | - loss: 1310.1265 - ci: 0.7150
2017-06-09 00:46:32,829 - Training step 630/2000  |*******                  | - loss: 1312.2480 - ci: 0.7150
2017-06-09 00:46:32,829 - Training step 630/2000  |*******                  | - loss: 1312.2480 - ci: 0.7150
2017-06-09 00:46:32,829 - Training step 630/2000  |*******                  | - loss: 1312.2480 - ci: 0.7150
2017-06-09 00:46:34,199 - Training step 640/2000  |********                 | - loss: 1305.7918 - ci: 0.7151
2017-06-09 00:46:34,199 - Training step 640/2000  |********                 | - loss: 1305.7918 - ci: 0.7151
2017-06-09 00:46:34,199 - Training step 640/2000  |********                 | - loss: 1305.7918 - ci: 0.7151
2017-06-09 00:46:35,686 - Training step 650/2000  |********                 | - loss: 1307.9812 - ci: 0.7151
2017-06-09 00:46:35,686 - Training step 650/2000  |********                 | - loss: 1307.9812 - ci: 0.7151
2017-06-09 00:46:35,686 - Training step 650/2000  |********                 | - loss: 1307.9812 - ci: 0.7151
2017-06-09 00:46:37,171 - Training step 660/2000  |********                 | - loss: 1316.7786 - ci: 0.7145
2017-06-09 00:46:37,171 - Training step 660/2000  |********                 | - loss: 1316.7786 - ci: 0.7145
2017-06-09 00:46:37,171 - Training step 660/2000  |********                 | - loss: 1316.7786 - ci: 0.7145
2017-06-09 00:46:38,746 - Training step 670/2000  |********                 | - loss: 1303.2462 - ci: 0.7143
2017-06-09 00:46:38,746 - Training step 670/2000  |********                 | - loss: 1303.2462 - ci: 0.7143
2017-06-09 00:46:38,746 - Training step 670/2000  |********                 | - loss: 1303.2462 - ci: 0.7143
2017-06-09 00:46:40,349 - Training step 680/2000  |********                 | - loss: 1306.9628 - ci: 0.7148
2017-06-09 00:46:40,349 - Training step 680/2000  |********                 | - loss: 1306.9628 - ci: 0.7148
2017-06-09 00:46:40,349 - Training step 680/2000  |********                 | - loss: 1306.9628 - ci: 0.7148
2017-06-09 00:46:41,841 - Training step 690/2000  |********                 | - loss: 1311.0211 - ci: 0.7154
2017-06-09 00:46:41,841 - Training step 690/2000  |********                 | - loss: 1311.0211 - ci: 0.7154
2017-06-09 00:46:41,841 - Training step 690/2000  |********                 | - loss: 1311.0211 - ci: 0.7154
2017-06-09 00:46:43,442 - Training step 700/2000  |********                 | - loss: 1312.8577 - ci: 0.7156
2017-06-09 00:46:43,442 - Training step 700/2000  |********                 | - loss: 1312.8577 - ci: 0.7156
2017-06-09 00:46:43,442 - Training step 700/2000  |********                 | - loss: 1312.8577 - ci: 0.7156
2017-06-09 00:46:44,795 - Training step 710/2000  |********                 | - loss: 1308.3466 - ci: 0.7151
2017-06-09 00:46:44,795 - Training step 710/2000  |********                 | - loss: 1308.3466 - ci: 0.7151
2017-06-09 00:46:44,795 - Training step 710/2000  |********                 | - loss: 1308.3466 - ci: 0.7151
2017-06-09 00:46:46,242 - Training step 720/2000  |*********                | - loss: 1314.3357 - ci: 0.7151
2017-06-09 00:46:46,242 - Training step 720/2000  |*********                | - loss: 1314.3357 - ci: 0.7151
2017-06-09 00:46:46,242 - Training step 720/2000  |*********                | - loss: 1314.3357 - ci: 0.7151
2017-06-09 00:46:47,763 - Training step 730/2000  |*********                | - loss: 1302.6228 - ci: 0.7154
2017-06-09 00:46:47,763 - Training step 730/2000  |*********                | - loss: 1302.6228 - ci: 0.7154
2017-06-09 00:46:47,763 - Training step 730/2000  |*********                | - loss: 1302.6228 - ci: 0.7154
2017-06-09 00:46:49,238 - Training step 740/2000  |*********                | - loss: 1319.1355 - ci: 0.7155
2017-06-09 00:46:49,238 - Training step 740/2000  |*********                | - loss: 1319.1355 - ci: 0.7155
2017-06-09 00:46:49,238 - Training step 740/2000  |*********                | - loss: 1319.1355 - ci: 0.7155
2017-06-09 00:46:50,971 - Training step 750/2000  |*********                | - loss: 1312.0689 - ci: 0.7157
2017-06-09 00:46:50,971 - Training step 750/2000  |*********                | - loss: 1312.0689 - ci: 0.7157
2017-06-09 00:46:50,971 - Training step 750/2000  |*********                | - loss: 1312.0689 - ci: 0.7157
2017-06-09 00:46:52,586 - Training step 760/2000  |*********                | - loss: 1311.4020 - ci: 0.7157
2017-06-09 00:46:52,586 - Training step 760/2000  |*********                | - loss: 1311.4020 - ci: 0.7157
2017-06-09 00:46:52,586 - Training step 760/2000  |*********                | - loss: 1311.4020 - ci: 0.7157
2017-06-09 00:46:54,225 - Training step 770/2000  |*********                | - loss: 1301.0336 - ci: 0.7156
2017-06-09 00:46:54,225 - Training step 770/2000  |*********                | - loss: 1301.0336 - ci: 0.7156
2017-06-09 00:46:54,225 - Training step 770/2000  |*********                | - loss: 1301.0336 - ci: 0.7156
2017-06-09 00:46:55,608 - Training step 780/2000  |*********                | - loss: 1311.9334 - ci: 0.7157
2017-06-09 00:46:55,608 - Training step 780/2000  |*********                | - loss: 1311.9334 - ci: 0.7157
2017-06-09 00:46:55,608 - Training step 780/2000  |*********                | - loss: 1311.9334 - ci: 0.7157
2017-06-09 00:46:57,072 - Training step 790/2000  |*********                | - loss: 1305.5237 - ci: 0.7157
2017-06-09 00:46:57,072 - Training step 790/2000  |*********                | - loss: 1305.5237 - ci: 0.7157
2017-06-09 00:46:57,072 - Training step 790/2000  |*********                | - loss: 1305.5237 - ci: 0.7157
2017-06-09 00:46:58,534 - Training step 800/2000  |**********               | - loss: 1305.0344 - ci: 0.7156
2017-06-09 00:46:58,534 - Training step 800/2000  |**********               | - loss: 1305.0344 - ci: 0.7156
2017-06-09 00:46:58,534 - Training step 800/2000  |**********               | - loss: 1305.0344 - ci: 0.7156
2017-06-09 00:46:59,910 - Training step 810/2000  |**********               | - loss: 1309.2998 - ci: 0.7157
2017-06-09 00:46:59,910 - Training step 810/2000  |**********               | - loss: 1309.2998 - ci: 0.7157
2017-06-09 00:46:59,910 - Training step 810/2000  |**********               | - loss: 1309.2998 - ci: 0.7157
2017-06-09 00:47:01,685 - Training step 820/2000  |**********               | - loss: 1307.1702 - ci: 0.7156
2017-06-09 00:47:01,685 - Training step 820/2000  |**********               | - loss: 1307.1702 - ci: 0.7156
2017-06-09 00:47:01,685 - Training step 820/2000  |**********               | - loss: 1307.1702 - ci: 0.7156
2017-06-09 00:47:03,399 - Training step 830/2000  |**********               | - loss: 1312.7261 - ci: 0.7158
2017-06-09 00:47:03,399 - Training step 830/2000  |**********               | - loss: 1312.7261 - ci: 0.7158
2017-06-09 00:47:03,399 - Training step 830/2000  |**********               | - loss: 1312.7261 - ci: 0.7158
2017-06-09 00:47:04,907 - Training step 840/2000  |**********               | - loss: 1311.6965 - ci: 0.7155
2017-06-09 00:47:04,907 - Training step 840/2000  |**********               | - loss: 1311.6965 - ci: 0.7155
2017-06-09 00:47:04,907 - Training step 840/2000  |**********               | - loss: 1311.6965 - ci: 0.7155
2017-06-09 00:47:06,431 - Training step 850/2000  |**********               | - loss: 1317.9277 - ci: 0.7156
2017-06-09 00:47:06,431 - Training step 850/2000  |**********               | - loss: 1317.9277 - ci: 0.7156
2017-06-09 00:47:06,431 - Training step 850/2000  |**********               | - loss: 1317.9277 - ci: 0.7156
2017-06-09 00:47:08,040 - Training step 860/2000  |**********               | - loss: 1303.9091 - ci: 0.7159
2017-06-09 00:47:08,040 - Training step 860/2000  |**********               | - loss: 1303.9091 - ci: 0.7159
2017-06-09 00:47:08,040 - Training step 860/2000  |**********               | - loss: 1303.9091 - ci: 0.7159
2017-06-09 00:47:09,499 - Training step 870/2000  |**********               | - loss: 1304.9647 - ci: 0.7162
2017-06-09 00:47:09,499 - Training step 870/2000  |**********               | - loss: 1304.9647 - ci: 0.7162
2017-06-09 00:47:09,499 - Training step 870/2000  |**********               | - loss: 1304.9647 - ci: 0.7162
2017-06-09 00:47:11,105 - Training step 880/2000  |***********              | - loss: 1313.2321 - ci: 0.7158
2017-06-09 00:47:11,105 - Training step 880/2000  |***********              | - loss: 1313.2321 - ci: 0.7158
2017-06-09 00:47:11,105 - Training step 880/2000  |***********              | - loss: 1313.2321 - ci: 0.7158
2017-06-09 00:47:12,664 - Training step 890/2000  |***********              | - loss: 1308.1376 - ci: 0.7160
2017-06-09 00:47:12,664 - Training step 890/2000  |***********              | - loss: 1308.1376 - ci: 0.7160
2017-06-09 00:47:12,664 - Training step 890/2000  |***********              | - loss: 1308.1376 - ci: 0.7160
2017-06-09 00:47:14,171 - Training step 900/2000  |***********              | - loss: 1301.8409 - ci: 0.7162
2017-06-09 00:47:14,171 - Training step 900/2000  |***********              | - loss: 1301.8409 - ci: 0.7162
2017-06-09 00:47:14,171 - Training step 900/2000  |***********              | - loss: 1301.8409 - ci: 0.7162
2017-06-09 00:47:15,781 - Training step 910/2000  |***********              | - loss: 1312.9708 - ci: 0.7160
2017-06-09 00:47:15,781 - Training step 910/2000  |***********              | - loss: 1312.9708 - ci: 0.7160
2017-06-09 00:47:15,781 - Training step 910/2000  |***********              | - loss: 1312.9708 - ci: 0.7160
2017-06-09 00:47:17,316 - Training step 920/2000  |***********              | - loss: 1310.2185 - ci: 0.7159
2017-06-09 00:47:17,316 - Training step 920/2000  |***********              | - loss: 1310.2185 - ci: 0.7159
2017-06-09 00:47:17,316 - Training step 920/2000  |***********              | - loss: 1310.2185 - ci: 0.7159
2017-06-09 00:47:18,840 - Training step 930/2000  |***********              | - loss: 1307.2955 - ci: 0.7157
2017-06-09 00:47:18,840 - Training step 930/2000  |***********              | - loss: 1307.2955 - ci: 0.7157
2017-06-09 00:47:18,840 - Training step 930/2000  |***********              | - loss: 1307.2955 - ci: 0.7157
2017-06-09 00:47:20,337 - Training step 940/2000  |***********              | - loss: 1313.8765 - ci: 0.7161
2017-06-09 00:47:20,337 - Training step 940/2000  |***********              | - loss: 1313.8765 - ci: 0.7161
2017-06-09 00:47:20,337 - Training step 940/2000  |***********              | - loss: 1313.8765 - ci: 0.7161
2017-06-09 00:47:21,886 - Training step 950/2000  |***********              | - loss: 1301.6431 - ci: 0.7162
2017-06-09 00:47:21,886 - Training step 950/2000  |***********              | - loss: 1301.6431 - ci: 0.7162
2017-06-09 00:47:21,886 - Training step 950/2000  |***********              | - loss: 1301.6431 - ci: 0.7162
2017-06-09 00:47:23,650 - Training step 960/2000  |************             | - loss: 1314.4310 - ci: 0.7162
2017-06-09 00:47:23,650 - Training step 960/2000  |************             | - loss: 1314.4310 - ci: 0.7162
2017-06-09 00:47:23,650 - Training step 960/2000  |************             | - loss: 1314.4310 - ci: 0.7162
2017-06-09 00:47:25,103 - Training step 970/2000  |************             | - loss: 1309.2297 - ci: 0.7163
2017-06-09 00:47:25,103 - Training step 970/2000  |************             | - loss: 1309.2297 - ci: 0.7163
2017-06-09 00:47:25,103 - Training step 970/2000  |************             | - loss: 1309.2297 - ci: 0.7163
2017-06-09 00:47:26,558 - Training step 980/2000  |************             | - loss: 1307.6507 - ci: 0.7162
2017-06-09 00:47:26,558 - Training step 980/2000  |************             | - loss: 1307.6507 - ci: 0.7162
2017-06-09 00:47:26,558 - Training step 980/2000  |************             | - loss: 1307.6507 - ci: 0.7162
2017-06-09 00:47:28,080 - Training step 990/2000  |************             | - loss: 1309.8273 - ci: 0.7161
2017-06-09 00:47:28,080 - Training step 990/2000  |************             | - loss: 1309.8273 - ci: 0.7161
2017-06-09 00:47:28,080 - Training step 990/2000  |************             | - loss: 1309.8273 - ci: 0.7161
2017-06-09 00:47:29,656 - Training step 1000/2000 |************             | - loss: 1305.8391 - ci: 0.7164
2017-06-09 00:47:29,656 - Training step 1000/2000 |************             | - loss: 1305.8391 - ci: 0.7164
2017-06-09 00:47:29,656 - Training step 1000/2000 |************             | - loss: 1305.8391 - ci: 0.7164
2017-06-09 00:47:31,035 - Training step 1010/2000 |************             | - loss: 1302.2721 - ci: 0.7163
2017-06-09 00:47:31,035 - Training step 1010/2000 |************             | - loss: 1302.2721 - ci: 0.7163
2017-06-09 00:47:31,035 - Training step 1010/2000 |************             | - loss: 1302.2721 - ci: 0.7163
2017-06-09 00:47:32,559 - Training step 1020/2000 |************             | - loss: 1323.5172 - ci: 0.7166
2017-06-09 00:47:32,559 - Training step 1020/2000 |************             | - loss: 1323.5172 - ci: 0.7166
2017-06-09 00:47:32,559 - Training step 1020/2000 |************             | - loss: 1323.5172 - ci: 0.7166
2017-06-09 00:47:34,097 - Training step 1030/2000 |************             | - loss: 1306.0175 - ci: 0.7162
2017-06-09 00:47:34,097 - Training step 1030/2000 |************             | - loss: 1306.0175 - ci: 0.7162
2017-06-09 00:47:34,097 - Training step 1030/2000 |************             | - loss: 1306.0175 - ci: 0.7162
2017-06-09 00:47:35,568 - Training step 1040/2000 |*************            | - loss: 1308.6968 - ci: 0.7166
2017-06-09 00:47:35,568 - Training step 1040/2000 |*************            | - loss: 1308.6968 - ci: 0.7166
2017-06-09 00:47:35,568 - Training step 1040/2000 |*************            | - loss: 1308.6968 - ci: 0.7166
2017-06-09 00:47:37,153 - Training step 1050/2000 |*************            | - loss: 1306.8967 - ci: 0.7158
2017-06-09 00:47:37,153 - Training step 1050/2000 |*************            | - loss: 1306.8967 - ci: 0.7158
2017-06-09 00:47:37,153 - Training step 1050/2000 |*************            | - loss: 1306.8967 - ci: 0.7158
2017-06-09 00:47:38,800 - Training step 1060/2000 |*************            | - loss: 1314.9792 - ci: 0.7164
2017-06-09 00:47:38,800 - Training step 1060/2000 |*************            | - loss: 1314.9792 - ci: 0.7164
2017-06-09 00:47:38,800 - Training step 1060/2000 |*************            | - loss: 1314.9792 - ci: 0.7164
2017-06-09 00:47:40,202 - Training step 1070/2000 |*************            | - loss: 1303.2147 - ci: 0.7166
2017-06-09 00:47:40,202 - Training step 1070/2000 |*************            | - loss: 1303.2147 - ci: 0.7166
2017-06-09 00:47:40,202 - Training step 1070/2000 |*************            | - loss: 1303.2147 - ci: 0.7166
2017-06-09 00:47:41,719 - Training step 1080/2000 |*************            | - loss: 1303.9220 - ci: 0.7156
2017-06-09 00:47:41,719 - Training step 1080/2000 |*************            | - loss: 1303.9220 - ci: 0.7156
2017-06-09 00:47:41,719 - Training step 1080/2000 |*************            | - loss: 1303.9220 - ci: 0.7156
2017-06-09 00:47:43,347 - Training step 1090/2000 |*************            | - loss: 1307.8010 - ci: 0.7166
2017-06-09 00:47:43,347 - Training step 1090/2000 |*************            | - loss: 1307.8010 - ci: 0.7166
2017-06-09 00:47:43,347 - Training step 1090/2000 |*************            | - loss: 1307.8010 - ci: 0.7166
2017-06-09 00:47:44,743 - Training step 1100/2000 |*************            | - loss: 1316.5170 - ci: 0.7163
2017-06-09 00:47:44,743 - Training step 1100/2000 |*************            | - loss: 1316.5170 - ci: 0.7163
2017-06-09 00:47:44,743 - Training step 1100/2000 |*************            | - loss: 1316.5170 - ci: 0.7163
2017-06-09 00:47:46,306 - Training step 1110/2000 |*************            | - loss: 1306.5222 - ci: 0.7165
2017-06-09 00:47:46,306 - Training step 1110/2000 |*************            | - loss: 1306.5222 - ci: 0.7165
2017-06-09 00:47:46,306 - Training step 1110/2000 |*************            | - loss: 1306.5222 - ci: 0.7165
2017-06-09 00:47:47,805 - Training step 1120/2000 |**************           | - loss: 1305.7558 - ci: 0.7163
2017-06-09 00:47:47,805 - Training step 1120/2000 |**************           | - loss: 1305.7558 - ci: 0.7163
2017-06-09 00:47:47,805 - Training step 1120/2000 |**************           | - loss: 1305.7558 - ci: 0.7163
2017-06-09 00:47:49,293 - Training step 1130/2000 |**************           | - loss: 1306.3765 - ci: 0.7166
2017-06-09 00:47:49,293 - Training step 1130/2000 |**************           | - loss: 1306.3765 - ci: 0.7166
2017-06-09 00:47:49,293 - Training step 1130/2000 |**************           | - loss: 1306.3765 - ci: 0.7166
2017-06-09 00:47:50,771 - Training step 1140/2000 |**************           | - loss: 1313.2537 - ci: 0.7170
2017-06-09 00:47:50,771 - Training step 1140/2000 |**************           | - loss: 1313.2537 - ci: 0.7170
2017-06-09 00:47:50,771 - Training step 1140/2000 |**************           | - loss: 1313.2537 - ci: 0.7170
2017-06-09 00:47:52,541 - Training step 1150/2000 |**************           | - loss: 1303.2732 - ci: 0.7168
2017-06-09 00:47:52,541 - Training step 1150/2000 |**************           | - loss: 1303.2732 - ci: 0.7168
2017-06-09 00:47:52,541 - Training step 1150/2000 |**************           | - loss: 1303.2732 - ci: 0.7168
2017-06-09 00:47:54,112 - Training step 1160/2000 |**************           | - loss: 1308.7607 - ci: 0.7165
2017-06-09 00:47:54,112 - Training step 1160/2000 |**************           | - loss: 1308.7607 - ci: 0.7165
2017-06-09 00:47:54,112 - Training step 1160/2000 |**************           | - loss: 1308.7607 - ci: 0.7165
2017-06-09 00:47:55,599 - Training step 1170/2000 |**************           | - loss: 1312.3324 - ci: 0.7168
2017-06-09 00:47:55,599 - Training step 1170/2000 |**************           | - loss: 1312.3324 - ci: 0.7168
2017-06-09 00:47:55,599 - Training step 1170/2000 |**************           | - loss: 1312.3324 - ci: 0.7168
2017-06-09 00:47:57,353 - Training step 1180/2000 |**************           | - loss: 1311.0442 - ci: 0.7168
2017-06-09 00:47:57,353 - Training step 1180/2000 |**************           | - loss: 1311.0442 - ci: 0.7168
2017-06-09 00:47:57,353 - Training step 1180/2000 |**************           | - loss: 1311.0442 - ci: 0.7168
2017-06-09 00:47:58,958 - Training step 1190/2000 |**************           | - loss: 1307.1786 - ci: 0.7166
2017-06-09 00:47:58,958 - Training step 1190/2000 |**************           | - loss: 1307.1786 - ci: 0.7166
2017-06-09 00:47:58,958 - Training step 1190/2000 |**************           | - loss: 1307.1786 - ci: 0.7166
2017-06-09 00:48:00,617 - Training step 1200/2000 |***************          | - loss: 1306.9219 - ci: 0.7166
2017-06-09 00:48:00,617 - Training step 1200/2000 |***************          | - loss: 1306.9219 - ci: 0.7166
2017-06-09 00:48:00,617 - Training step 1200/2000 |***************          | - loss: 1306.9219 - ci: 0.7166
2017-06-09 00:48:02,028 - Training step 1210/2000 |***************          | - loss: 1311.4526 - ci: 0.7168
2017-06-09 00:48:02,028 - Training step 1210/2000 |***************          | - loss: 1311.4526 - ci: 0.7168
2017-06-09 00:48:02,028 - Training step 1210/2000 |***************          | - loss: 1311.4526 - ci: 0.7168
2017-06-09 00:48:03,632 - Training step 1220/2000 |***************          | - loss: 1317.0579 - ci: 0.7170
2017-06-09 00:48:03,632 - Training step 1220/2000 |***************          | - loss: 1317.0579 - ci: 0.7170
2017-06-09 00:48:03,632 - Training step 1220/2000 |***************          | - loss: 1317.0579 - ci: 0.7170
2017-06-09 00:48:05,198 - Training step 1230/2000 |***************          | - loss: 1308.2753 - ci: 0.7166
2017-06-09 00:48:05,198 - Training step 1230/2000 |***************          | - loss: 1308.2753 - ci: 0.7166
2017-06-09 00:48:05,198 - Training step 1230/2000 |***************          | - loss: 1308.2753 - ci: 0.7166
2017-06-09 00:48:06,607 - Training step 1240/2000 |***************          | - loss: 1302.5891 - ci: 0.7170
2017-06-09 00:48:06,607 - Training step 1240/2000 |***************          | - loss: 1302.5891 - ci: 0.7170
2017-06-09 00:48:06,607 - Training step 1240/2000 |***************          | - loss: 1302.5891 - ci: 0.7170
2017-06-09 00:48:08,149 - Training step 1250/2000 |***************          | - loss: 1305.9572 - ci: 0.7171
2017-06-09 00:48:08,149 - Training step 1250/2000 |***************          | - loss: 1305.9572 - ci: 0.7171
2017-06-09 00:48:08,149 - Training step 1250/2000 |***************          | - loss: 1305.9572 - ci: 0.7171
2017-06-09 00:48:09,700 - Training step 1260/2000 |***************          | - loss: 1313.5267 - ci: 0.7170
2017-06-09 00:48:09,700 - Training step 1260/2000 |***************          | - loss: 1313.5267 - ci: 0.7170
2017-06-09 00:48:09,700 - Training step 1260/2000 |***************          | - loss: 1313.5267 - ci: 0.7170
2017-06-09 00:48:11,097 - Training step 1270/2000 |***************          | - loss: 1314.9220 - ci: 0.7171
2017-06-09 00:48:11,097 - Training step 1270/2000 |***************          | - loss: 1314.9220 - ci: 0.7171
2017-06-09 00:48:11,097 - Training step 1270/2000 |***************          | - loss: 1314.9220 - ci: 0.7171
2017-06-09 00:48:13,150 - Training step 1280/2000 |****************         | - loss: 1304.7164 - ci: 0.7172
2017-06-09 00:48:13,150 - Training step 1280/2000 |****************         | - loss: 1304.7164 - ci: 0.7172
2017-06-09 00:48:13,150 - Training step 1280/2000 |****************         | - loss: 1304.7164 - ci: 0.7172
2017-06-09 00:48:14,942 - Training step 1290/2000 |****************         | - loss: 1305.4498 - ci: 0.7172
2017-06-09 00:48:14,942 - Training step 1290/2000 |****************         | - loss: 1305.4498 - ci: 0.7172
2017-06-09 00:48:14,942 - Training step 1290/2000 |****************         | - loss: 1305.4498 - ci: 0.7172
2017-06-09 00:48:16,413 - Training step 1300/2000 |****************         | - loss: 1312.6072 - ci: 0.7170
2017-06-09 00:48:16,413 - Training step 1300/2000 |****************         | - loss: 1312.6072 - ci: 0.7170
2017-06-09 00:48:16,413 - Training step 1300/2000 |****************         | - loss: 1312.6072 - ci: 0.7170
2017-06-09 00:48:17,969 - Training step 1310/2000 |****************         | - loss: 1318.5777 - ci: 0.7166
2017-06-09 00:48:17,969 - Training step 1310/2000 |****************         | - loss: 1318.5777 - ci: 0.7166
2017-06-09 00:48:17,969 - Training step 1310/2000 |****************         | - loss: 1318.5777 - ci: 0.7166
2017-06-09 00:48:19,884 - Training step 1320/2000 |****************         | - loss: 1299.2790 - ci: 0.7164
2017-06-09 00:48:19,884 - Training step 1320/2000 |****************         | - loss: 1299.2790 - ci: 0.7164
2017-06-09 00:48:19,884 - Training step 1320/2000 |****************         | - loss: 1299.2790 - ci: 0.7164
2017-06-09 00:48:21,745 - Training step 1330/2000 |****************         | - loss: 1305.1444 - ci: 0.7164
2017-06-09 00:48:21,745 - Training step 1330/2000 |****************         | - loss: 1305.1444 - ci: 0.7164
2017-06-09 00:48:21,745 - Training step 1330/2000 |****************         | - loss: 1305.1444 - ci: 0.7164
2017-06-09 00:48:23,384 - Training step 1340/2000 |****************         | - loss: 1307.1364 - ci: 0.7168
2017-06-09 00:48:23,384 - Training step 1340/2000 |****************         | - loss: 1307.1364 - ci: 0.7168
2017-06-09 00:48:23,384 - Training step 1340/2000 |****************         | - loss: 1307.1364 - ci: 0.7168
2017-06-09 00:48:24,919 - Training step 1350/2000 |****************         | - loss: 1304.9625 - ci: 0.7166
2017-06-09 00:48:24,919 - Training step 1350/2000 |****************         | - loss: 1304.9625 - ci: 0.7166
2017-06-09 00:48:24,919 - Training step 1350/2000 |****************         | - loss: 1304.9625 - ci: 0.7166
2017-06-09 00:48:26,539 - Training step 1360/2000 |*****************        | - loss: 1314.5679 - ci: 0.7165
2017-06-09 00:48:26,539 - Training step 1360/2000 |*****************        | - loss: 1314.5679 - ci: 0.7165
2017-06-09 00:48:26,539 - Training step 1360/2000 |*****************        | - loss: 1314.5679 - ci: 0.7165
2017-06-09 00:48:27,950 - Training step 1370/2000 |*****************        | - loss: 1301.6400 - ci: 0.7165
2017-06-09 00:48:27,950 - Training step 1370/2000 |*****************        | - loss: 1301.6400 - ci: 0.7165
2017-06-09 00:48:27,950 - Training step 1370/2000 |*****************        | - loss: 1301.6400 - ci: 0.7165
2017-06-09 00:48:29,480 - Training step 1380/2000 |*****************        | - loss: 1305.9058 - ci: 0.7164
2017-06-09 00:48:29,480 - Training step 1380/2000 |*****************        | - loss: 1305.9058 - ci: 0.7164
2017-06-09 00:48:29,480 - Training step 1380/2000 |*****************        | - loss: 1305.9058 - ci: 0.7164
2017-06-09 00:48:31,083 - Training step 1390/2000 |*****************        | - loss: 1313.5641 - ci: 0.7172
2017-06-09 00:48:31,083 - Training step 1390/2000 |*****************        | - loss: 1313.5641 - ci: 0.7172
2017-06-09 00:48:31,083 - Training step 1390/2000 |*****************        | - loss: 1313.5641 - ci: 0.7172
2017-06-09 00:48:32,781 - Training step 1400/2000 |*****************        | - loss: 1303.2757 - ci: 0.7165
2017-06-09 00:48:32,781 - Training step 1400/2000 |*****************        | - loss: 1303.2757 - ci: 0.7165
2017-06-09 00:48:32,781 - Training step 1400/2000 |*****************        | - loss: 1303.2757 - ci: 0.7165
2017-06-09 00:48:34,510 - Training step 1410/2000 |*****************        | - loss: 1307.2826 - ci: 0.7169
2017-06-09 00:48:34,510 - Training step 1410/2000 |*****************        | - loss: 1307.2826 - ci: 0.7169
2017-06-09 00:48:34,510 - Training step 1410/2000 |*****************        | - loss: 1307.2826 - ci: 0.7169
2017-06-09 00:48:36,606 - Training step 1420/2000 |*****************        | - loss: 1309.8158 - ci: 0.7171
2017-06-09 00:48:36,606 - Training step 1420/2000 |*****************        | - loss: 1309.8158 - ci: 0.7171
2017-06-09 00:48:36,606 - Training step 1420/2000 |*****************        | - loss: 1309.8158 - ci: 0.7171
2017-06-09 00:48:38,431 - Training step 1430/2000 |*****************        | - loss: 1302.8519 - ci: 0.7168
2017-06-09 00:48:38,431 - Training step 1430/2000 |*****************        | - loss: 1302.8519 - ci: 0.7168
2017-06-09 00:48:38,431 - Training step 1430/2000 |*****************        | - loss: 1302.8519 - ci: 0.7168
2017-06-09 00:48:40,206 - Training step 1440/2000 |******************       | - loss: 1306.5000 - ci: 0.7167
2017-06-09 00:48:40,206 - Training step 1440/2000 |******************       | - loss: 1306.5000 - ci: 0.7167
2017-06-09 00:48:40,206 - Training step 1440/2000 |******************       | - loss: 1306.5000 - ci: 0.7167
2017-06-09 00:48:41,775 - Training step 1450/2000 |******************       | - loss: 1305.4064 - ci: 0.7171
2017-06-09 00:48:41,775 - Training step 1450/2000 |******************       | - loss: 1305.4064 - ci: 0.7171
2017-06-09 00:48:41,775 - Training step 1450/2000 |******************       | - loss: 1305.4064 - ci: 0.7171
2017-06-09 00:48:43,271 - Training step 1460/2000 |******************       | - loss: 1301.8946 - ci: 0.7166
2017-06-09 00:48:43,271 - Training step 1460/2000 |******************       | - loss: 1301.8946 - ci: 0.7166
2017-06-09 00:48:43,271 - Training step 1460/2000 |******************       | - loss: 1301.8946 - ci: 0.7166
2017-06-09 00:48:44,792 - Training step 1470/2000 |******************       | - loss: 1305.7367 - ci: 0.7164
2017-06-09 00:48:44,792 - Training step 1470/2000 |******************       | - loss: 1305.7367 - ci: 0.7164
2017-06-09 00:48:44,792 - Training step 1470/2000 |******************       | - loss: 1305.7367 - ci: 0.7164
2017-06-09 00:48:46,353 - Training step 1480/2000 |******************       | - loss: 1305.8437 - ci: 0.7167
2017-06-09 00:48:46,353 - Training step 1480/2000 |******************       | - loss: 1305.8437 - ci: 0.7167
2017-06-09 00:48:46,353 - Training step 1480/2000 |******************       | - loss: 1305.8437 - ci: 0.7167
2017-06-09 00:48:48,201 - Training step 1490/2000 |******************       | - loss: 1310.8111 - ci: 0.7165
2017-06-09 00:48:48,201 - Training step 1490/2000 |******************       | - loss: 1310.8111 - ci: 0.7165
2017-06-09 00:48:48,201 - Training step 1490/2000 |******************       | - loss: 1310.8111 - ci: 0.7165
2017-06-09 00:48:49,828 - Training step 1500/2000 |******************       | - loss: 1307.7474 - ci: 0.7174
2017-06-09 00:48:49,828 - Training step 1500/2000 |******************       | - loss: 1307.7474 - ci: 0.7174
2017-06-09 00:48:49,828 - Training step 1500/2000 |******************       | - loss: 1307.7474 - ci: 0.7174
2017-06-09 00:48:51,768 - Training step 1510/2000 |******************       | - loss: 1309.9571 - ci: 0.7168
2017-06-09 00:48:51,768 - Training step 1510/2000 |******************       | - loss: 1309.9571 - ci: 0.7168
2017-06-09 00:48:51,768 - Training step 1510/2000 |******************       | - loss: 1309.9571 - ci: 0.7168
2017-06-09 00:48:53,611 - Training step 1520/2000 |*******************      | - loss: 1301.3815 - ci: 0.7167
2017-06-09 00:48:53,611 - Training step 1520/2000 |*******************      | - loss: 1301.3815 - ci: 0.7167
2017-06-09 00:48:53,611 - Training step 1520/2000 |*******************      | - loss: 1301.3815 - ci: 0.7167
2017-06-09 00:48:55,207 - Training step 1530/2000 |*******************      | - loss: 1302.8541 - ci: 0.7171
2017-06-09 00:48:55,207 - Training step 1530/2000 |*******************      | - loss: 1302.8541 - ci: 0.7171
2017-06-09 00:48:55,207 - Training step 1530/2000 |*******************      | - loss: 1302.8541 - ci: 0.7171
2017-06-09 00:48:56,637 - Training step 1540/2000 |*******************      | - loss: 1307.4468 - ci: 0.7169
2017-06-09 00:48:56,637 - Training step 1540/2000 |*******************      | - loss: 1307.4468 - ci: 0.7169
2017-06-09 00:48:56,637 - Training step 1540/2000 |*******************      | - loss: 1307.4468 - ci: 0.7169
2017-06-09 00:48:58,191 - Training step 1550/2000 |*******************      | - loss: 1299.4765 - ci: 0.7171
2017-06-09 00:48:58,191 - Training step 1550/2000 |*******************      | - loss: 1299.4765 - ci: 0.7171
2017-06-09 00:48:58,191 - Training step 1550/2000 |*******************      | - loss: 1299.4765 - ci: 0.7171
2017-06-09 00:48:59,825 - Training step 1560/2000 |*******************      | - loss: 1306.8862 - ci: 0.7170
2017-06-09 00:48:59,825 - Training step 1560/2000 |*******************      | - loss: 1306.8862 - ci: 0.7170
2017-06-09 00:48:59,825 - Training step 1560/2000 |*******************      | - loss: 1306.8862 - ci: 0.7170
2017-06-09 00:49:01,263 - Training step 1570/2000 |*******************      | - loss: 1304.6097 - ci: 0.7171
2017-06-09 00:49:01,263 - Training step 1570/2000 |*******************      | - loss: 1304.6097 - ci: 0.7171
2017-06-09 00:49:01,263 - Training step 1570/2000 |*******************      | - loss: 1304.6097 - ci: 0.7171
2017-06-09 00:49:02,839 - Training step 1580/2000 |*******************      | - loss: 1304.6801 - ci: 0.7174
2017-06-09 00:49:02,839 - Training step 1580/2000 |*******************      | - loss: 1304.6801 - ci: 0.7174
2017-06-09 00:49:02,839 - Training step 1580/2000 |*******************      | - loss: 1304.6801 - ci: 0.7174
2017-06-09 00:49:04,410 - Training step 1590/2000 |*******************      | - loss: 1315.5168 - ci: 0.7171
2017-06-09 00:49:04,410 - Training step 1590/2000 |*******************      | - loss: 1315.5168 - ci: 0.7171
2017-06-09 00:49:04,410 - Training step 1590/2000 |*******************      | - loss: 1315.5168 - ci: 0.7171
2017-06-09 00:49:06,072 - Training step 1600/2000 |********************     | - loss: 1301.2437 - ci: 0.7172
2017-06-09 00:49:06,072 - Training step 1600/2000 |********************     | - loss: 1301.2437 - ci: 0.7172
2017-06-09 00:49:06,072 - Training step 1600/2000 |********************     | - loss: 1301.2437 - ci: 0.7172
2017-06-09 00:49:07,450 - Training step 1610/2000 |********************     | - loss: 1309.3214 - ci: 0.7173
2017-06-09 00:49:07,450 - Training step 1610/2000 |********************     | - loss: 1309.3214 - ci: 0.7173
2017-06-09 00:49:07,450 - Training step 1610/2000 |********************     | - loss: 1309.3214 - ci: 0.7173
2017-06-09 00:49:08,995 - Training step 1620/2000 |********************     | - loss: 1308.7008 - ci: 0.7171
2017-06-09 00:49:08,995 - Training step 1620/2000 |********************     | - loss: 1308.7008 - ci: 0.7171
2017-06-09 00:49:08,995 - Training step 1620/2000 |********************     | - loss: 1308.7008 - ci: 0.7171
2017-06-09 00:49:10,646 - Training step 1630/2000 |********************     | - loss: 1307.7787 - ci: 0.7174
2017-06-09 00:49:12,189 - Training step 1640/2000 |********************     | - loss: 1308.7579 - ci: 0.7171
2017-06-09 00:49:13,770 - Training step 1650/2000 |********************     | - loss: 1306.4091 - ci: 0.7175
2017-06-09 00:49:15,446 - Training step 1660/2000 |********************     | - loss: 1307.1218 - ci: 0.7172
2017-06-09 00:49:17,031 - Training step 1670/2000 |********************     | - loss: 1306.0028 - ci: 0.7173
2017-06-09 00:49:18,601 - Training step 1680/2000 |*********************    | - loss: 1303.7675 - ci: 0.7173
2017-06-09 00:49:20,115 - Training step 1690/2000 |*********************    | - loss: 1298.7003 - ci: 0.7173
2017-06-09 00:49:21,697 - Training step 1700/2000 |*********************    | - loss: 1301.8650 - ci: 0.7174
2017-06-09 00:49:23,154 - Training step 1710/2000 |*********************    | - loss: 1308.9804 - ci: 0.7174
2017-06-09 00:49:24,903 - Training step 1720/2000 |*********************    | - loss: 1309.4044 - ci: 0.7176
2017-06-09 00:49:26,473 - Training step 1730/2000 |*********************    | - loss: 1314.2817 - ci: 0.7175
2017-06-09 00:49:27,921 - Training step 1740/2000 |*********************    | - loss: 1309.6833 - ci: 0.7178
2017-06-09 00:49:29,498 - Training step 1750/2000 |*********************    | - loss: 1298.7552 - ci: 0.7172
2017-06-09 00:49:30,982 - Training step 1760/2000 |**********************   | - loss: 1306.7221 - ci: 0.7171
2017-06-09 00:49:32,477 - Training step 1770/2000 |**********************   | - loss: 1303.1513 - ci: 0.7173
2017-06-09 00:49:33,820 - Training step 1780/2000 |**********************   | - loss: 1306.6945 - ci: 0.7175
2017-06-09 00:49:35,320 - Training step 1790/2000 |**********************   | - loss: 1306.0719 - ci: 0.7182
2017-06-09 00:49:36,922 - Training step 1800/2000 |**********************   | - loss: 1311.2468 - ci: 0.7175
2017-06-09 00:49:38,366 - Training step 1810/2000 |**********************   | - loss: 1295.1567 - ci: 0.7181
2017-06-09 00:49:39,810 - Training step 1820/2000 |**********************   | - loss: 1302.8696 - ci: 0.7175
2017-06-09 00:49:41,212 - Training step 1830/2000 |**********************   | - loss: 1295.8949 - ci: 0.7174
2017-06-09 00:49:42,525 - Training step 1840/2000 |***********************  | - loss: 1306.5958 - ci: 0.7175
2017-06-09 00:49:43,972 - Training step 1850/2000 |***********************  | - loss: 1309.5176 - ci: 0.7177
2017-06-09 00:49:45,440 - Training step 1860/2000 |***********************  | - loss: 1310.6745 - ci: 0.7176
2017-06-09 00:49:46,839 - Training step 1870/2000 |***********************  | - loss: 1308.0200 - ci: 0.7178
2017-06-09 00:49:48,448 - Training step 1880/2000 |***********************  | - loss: 1307.3695 - ci: 0.7170
2017-06-09 00:49:50,024 - Training step 1890/2000 |***********************  | - loss: 1304.3683 - ci: 0.7175
2017-06-09 00:49:51,505 - Training step 1900/2000 |***********************  | - loss: 1301.3851 - ci: 0.7171
2017-06-09 00:49:53,285 - Training step 1910/2000 |***********************  | - loss: 1302.7624 - ci: 0.7171
2017-06-09 00:49:55,362 - Training step 1920/2000 |************************ | - loss: 1312.6177 - ci: 0.7172
2017-06-09 00:49:57,242 - Training step 1930/2000 |************************ | - loss: 1306.5119 - ci: 0.7171
2017-06-09 00:49:58,764 - Training step 1940/2000 |************************ | - loss: 1308.2971 - ci: 0.7171
2017-06-09 00:50:00,450 - Training step 1950/2000 |************************ | - loss: 1302.4246 - ci: 0.7174
2017-06-09 00:50:02,139 - Training step 1960/2000 |************************ | - loss: 1315.1864 - ci: 0.7175
2017-06-09 00:50:03,562 - Training step 1970/2000 |************************ | - loss: 1312.8696 - ci: 0.7174
2017-06-09 00:50:05,144 - Training step 1980/2000 |************************ | - loss: 1304.4548 - ci: 0.7171
2017-06-09 00:50:06,791 - Training step 1990/2000 |************************ | - loss: 1306.8314 - ci: 0.7178
2017-06-09 00:50:08,133 - Finished Training with 2000 iterations in 313.59s

There are two ways to visualize how the model trained:

  • TensorBoard (requires the tensorboard package), which provides real-time metrics during training. Run the following command in a shell:

    tensorboard --logdir './logs/tensorboard'

  • Plot the training curves after training finishes (see the code cell below)

In [32]:
# Print the final metrics
# Note: metrics['c-index'][-1] is a (step, value) pair, so this prints a tuple
print('Train C-Index:', metrics['c-index'][-1])
# print('Valid C-Index:', metrics['valid_c-index'][-1])

# Plot the training / validation curves
viz.plot_log(metrics)


Train C-Index: (1999, 0.71723313860860327)
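
If you want the final C-index as a plain number rather than a (step, value) tuple, you can unpack the last entry. The cell below is a minimal sketch: it assumes the DeepSurv class in deep_surv.py exposes a get_concordance_index(x, t, e) method (check the source for the exact name and argument order), and it simply re-scores the training data; pass a held-out test set formatted the same way as train_data for an unbiased estimate.

In [ ]:
# Unpack the last logged (step, value) pair for the concordance index
final_step, final_ci = metrics['c-index'][-1]
print('Final training C-Index at step %d: %.4f' % (final_step, final_ci))

# Assumption: deep_surv.DeepSurv provides get_concordance_index(x, t, e);
# confirm the method name and argument order in deep_surv.py before relying on it.
# This re-scores the training data; substitute a held-out test set
# (same dictionary format as train_data) to estimate out-of-sample performance.
recomputed_ci = model.get_concordance_index(
    train_data['x'],
    train_data['t'],
    train_data['e'],
)
print('Recomputed C-Index: %.4f' % recomputed_ci)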